75 research outputs found

    Socio-cyber-physical systems’ threats classifier

    Summary: The article considers a new approach to forming a threat classifier for socio-cyber-physical systems. These systems are, as a rule, complex and based on the synthesis of cyber-physical components with smart technologies and social networks. Importantly, such systems also belong to critical infrastructure, which requires a new approach to creating multi-contour security systems. An important task in forming the security contour of such systems is the development of a threat classifier that makes it possible to objectively assess the criticality of the elements of an organization's infrastructure. The presented classifier applies an expert approach at the initial stage to establish weighting factors for the influence of threats such as anomalies, deviations from normal functioning, and computer incidents. At the next stage, the characteristics of the impact of threats on the platforms of socio-cyber-physical systems, as well as their impact on the external and internal aspects of the system, are determined. The influence of social engineering methods, which can significantly increase the risk of threat implementation and create various channels for their realization, including mixed (targeted) attacks, is considered in detail. On the basis of the proposed approach to threat classification, a method is proposed for assessing the current level of security of socio-cyber-physical systems and for identifying the critical points of the system infrastructure. Countermeasures and the ability of multi-loop security systems to provide effective infrastructure protection are also discussed.
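The expert weighting stage described in this summary can be sketched as a weighted scoring of infrastructure elements. The threat categories follow the abstract (anomalies, deviations, incidents), but the weight values and severity inputs below are illustrative assumptions, not the article's data:

```python
# Hypothetical sketch of the expert-weighting stage of the threat
# classifier. Weights and severity estimates are illustrative.

THREAT_WEIGHTS = {          # expert-assigned influence weights (sum to 1.0)
    "anomaly": 0.2,
    "deviation": 0.3,       # deviation from normal functioning
    "incident": 0.5,        # computer incident
}

def criticality(observations):
    """Weighted criticality score for one infrastructure element.

    observations maps a threat category to an expert severity
    estimate in [0, 1].
    """
    return sum(THREAT_WEIGHTS[c] * s for c, s in observations.items())

# Example: an element with a severe incident and a mild anomaly.
score = criticality({"anomaly": 0.1, "deviation": 0.4, "incident": 0.9})
print(round(score, 2))  # 0.2*0.1 + 0.3*0.4 + 0.5*0.9 = 0.59
```

Elements whose scores exceed an expert-chosen threshold would then be flagged as critical points of the infrastructure.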

    Researching a machine learning algorithm for a face recognition system

    This article investigates the problem of using machine learning algorithms to recognize and identify a user in a video sequence. The scientific novelty lies in the proposed improved Viola-Jones method, which allows more efficient and faster recognition of a person's face. The practical value of the results is determined by the possibility of using the proposed method to create human face recognition systems. A review of existing face recognition methods, their main characteristics, architectures and features was carried out. Based on a study of methods and algorithms for finding faces in images, the Viola-Jones method, the wavelet transform and the method of principal components were chosen; these methods are among the best in terms of the ratio of recognition efficiency to speed. Possible modifications of the Viola-Jones method are presented. The main contribution of this article is an experimental study of the impact of various types of noise, and the improvement of company security through the development of a computer system for recognizing and identifying users in a video sequence. During the study, the following tasks were solved: a face recognition model is proposed, in which the system automatically detects a person's face in an image (scanned photos or video materials); a face analysis algorithm is proposed, representing a person's face as 68 nodal points; an algorithm for creating a digital fingerprint of a face is proposed, which converts the results of facial analysis into a digital code; and a match-search module is developed, which compares the faceprint with the database until a match is found.
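The "digital fingerprint" and match-search steps can be illustrated with a minimal sketch: the 68 landmark points are normalized into a translation- and scale-invariant vector, and the database is scanned for the nearest stored print. The normalization and the distance threshold are assumptions for illustration; landmark extraction itself (Viola-Jones detection, 68-point analysis) is outside this sketch:

```python
import math

def faceprint(landmarks):
    """Normalize 68 (x, y) points into a translation/scale-invariant vector."""
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2
                          for x, y in landmarks)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in landmarks]

def match(print_a, database, threshold=0.1):
    """Return the id of the closest stored faceprint within threshold, else None."""
    best_id, best_d = None, threshold
    for face_id, print_b in database.items():
        d = math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2
                          for (ax, ay), (bx, by) in zip(print_a, print_b)))
        if d < best_d:
            best_id, best_d = face_id, d
    return best_id

# A scaled and translated copy of the same landmarks matches its original.
pts = [(i % 10, i // 10) for i in range(68)]
db = {"alice": faceprint(pts)}
moved = [(2 * x + 5, 2 * y + 5) for x, y in pts]
print(match(faceprint(moved), db))  # alice
```

In a real system the faceprint would come from a learned embedding rather than raw landmark geometry, but the search loop has the same shape.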

    IMPROVEMENT OF PROJECT RISK ASSESSMENT METHODS OF IMPLEMENTATION OF AUTOMATED INFORMATION COMPONENTS OF NON-COMMERCIAL ORGANIZATIONAL AND TECHNICAL SYSTEMS

    The results of a study using the methodological apparatus of fuzzy logic theory and automated tools for analyzing input data are presented, applied to the risk assessment of projects implementing automated information components of organizational and technical systems. Based on a model of logistics projects for motor transport units, the method for assessing the risks of projects implementing automated information components of non-commercial organizational and technical systems has been improved. To this end, the peculiarities of implementing ERP projects as commercial ones are analyzed, together with the specifics of the activities of state institutions, where the assessment is based on the successful completion of tasks rather than on economic indicators. It is shown that a system of risk assessment indicators can be formulated for the reduction in effectiveness of projects implementing automated information systems in non-commercial organizational and technical systems. A meaningful interpretation of the fuzzy approach is given for formalizing the risk assessment process for automated information system projects of public institutions. A tree of fuzzy inference is constructed from a study of the indicator descriptions and expert assessments of the implementation risk of such an automated information system project. The improved method differs from known ones in its use of hierarchical fuzzy inference, which makes it possible to quantify risks, reduce the time needed to evaluate project risks, and improve the quality of decisions. An increase in the number of input variables increases the complexity (the number of rules) of a fuzzy inference system; constructing a hierarchical system of fuzzy inference and knowledge bases can reduce that complexity.
The development of a software module based on the algorithm of this method, as part of the corporate automated information systems of non-commercial organizational and technical systems, will reduce the time needed for risk assessment of projects implementing automated information systems.
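The rule-base reduction claimed for hierarchical fuzzy inference is easy to quantify. A flat system with n inputs and m linguistic terms per input needs m**n rules, while a cascade of two-input inference blocks needs only (n - 1) * m**2. The numbers below are illustrative; the article's actual indicator tree is not reproduced:

```python
# Rule-count comparison: flat vs. hierarchical (cascaded) fuzzy inference.

def flat_rules(n_inputs, terms):
    """Every combination of input terms gets its own rule."""
    return terms ** n_inputs

def hierarchical_rules(n_inputs, terms):
    """A chain of two-input inference blocks, each with terms**2 rules."""
    return (n_inputs - 1) * terms ** 2

for n in (3, 5, 7):
    print(n, flat_rules(n, 3), hierarchical_rules(n, 3))
# e.g. 7 risk indicators with 3 terms each: 2187 flat rules vs. 54
```

The exponential-to-linear drop in rule count is what shortens both knowledge-base construction and evaluation time.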

    MODELING THE PROTECTION OF PERSONAL DATA FROM TRUST AND THE AMOUNT OF INFORMATION ON SOCIAL NETWORKS

    The article analyzes the parameters of social networks in order to identify critical threats that may lead to leakage or damage of personal data. The complexity of this issue lies in the ever-increasing volume of data. Analysts note that the main causes of incidents involving Internet resources are related to the human factor and to the mass hacking of IoT devices and cloud services. The problem is especially exacerbated by the increasingly digital, humanistic nature of education and by the growing role of social networks in human life in general, so the issue of personal information protection keeps growing in importance. To address it, a method is proposed for assessing the dependence of personal data protection on the amount of information in the system and on trust in social networks. The method is based on a mathematical model that determines personal data protection as a function of trust in social networks. Using the proposed model, simulations were performed for different types of changes in the trust parameters and in the amount of information in the system. Mathematical modeling in the MATLAB environment produced graphical results showing that the protection of personal data increases with increasing trust in information, and that its dependence on trust is proportional to the other data protection parameters. The modeling confirmed the reliability of the developed model and showed that the protection of personal data is proportional to reliability and trust.
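The qualitative behavior reported here (protection grows with trust, and the growing volume of information makes protection harder) can be captured by a simple stand-in model. The formula, the coefficients k and a, and the parameter values below are illustrative assumptions; the article's actual model is not reproduced:

```python
# Hypothetical dependence of personal-data protection P on trust T and
# information volume V: P = k * T / (1 + a * V). Coefficients are
# illustrative, not taken from the paper.

def protection(trust, volume, k=1.0, a=0.001):
    return k * trust / (1.0 + a * volume)

# Protection rises monotonically with trust at a fixed volume...
levels = [protection(t, volume=500) for t in (0.2, 0.5, 0.9)]
assert levels == sorted(levels)
# ...and falls as the stored volume of information grows.
assert protection(0.5, 100) > protection(0.5, 10_000)
print([round(p, 3) for p in levels])
```

Sweeping `trust` and `volume` over grids and plotting `protection` reproduces, in miniature, the kind of surface the authors obtained in MATLAB.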

    The method of signal discretization to minimize the fallibility of information recovery

    The paper proposes a fundamentally new approach to formulating the problem of optimizing the discretization interval (frequency). Traditional methods of restoring an analog signal from its discrete implementations consist of sequentially solving two problems: restoring the output signal from the discrete signal at the output of a digital block, and restoring the input signal of an analog block from its output signal. However, this approach leads to methodical fallibility caused by interpolation when solving the first problem and by regularization of the equation when solving the second. The aim of the work is to develop a signal discretization method that minimizes the fallibility of information recovery and determines the optimal discretization frequency. The proposed method for determining the optimal discretization rate makes it possible to exclude both components of the methodological fallibility in recovering information about the input signal. This is achieved because, to solve the reconstruction problem, instead of the known equation a relation is used that connects the input signal of the analog block with the output discrete signal of the digital block. The proposed relation is free of the instabilities inherent in the well-known equation, so solving it requires neither interpolation nor regularization, which means there are no components of the methodological fallibility caused by those operations. In addition, the proposed relation jointly accounts for the properties of the interference in the output signal of the digital block and the frequency properties of the transforming operator, which makes it possible to minimize the fallibility in restoring the input signal of the analog block and to determine the optimal discretization frequency. A widespread contradiction in the field of recovering signal information from discrete values has been investigated.
A decrease in the discretization frequency below the optimal one leads to an increase in the approximation fallibility and to the loss of some information about the input signal of the analog-to-digital signal processing device. At the same time, an unjustified overestimation of the discretization rate, which complicates the technical implementation of the device, is not useful: not only does it fail to increase the information about the input signal, but, when restoration is required, it reduces that information, because the effect of noise in the output signal on the recovery accuracy grows. The proposed signal discretization method, based on the minimum information recovery fallibility, determines the optimal discretization rate and thereby resolves this contradiction.
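The contradiction can be shown numerically: the total recovery error is the sum of an approximation term that falls as the discretization rate grows and a noise-amplification term that rises with it, so the total has an interior minimum. The error models and constants below are illustrative assumptions, not the paper's formulas:

```python
# Total information-recovery error as a function of discretization rate f:
# an approximation term that decays with f plus a noise term that grows
# with f (illustrative models).

def total_error(f, sig_bw=100.0, noise=0.01):
    approx = (sig_bw / f) ** 2      # interpolation/approximation fallibility
    amplified = noise * f / sig_bw  # noise effect grows with over-discretization
    return approx + amplified

rates = [50 * i for i in range(1, 201)]        # candidate rates: 50 ... 10000
f_opt = min(rates, key=total_error)
print(f_opt)  # → 600
```

Below `f_opt` the approximation term dominates; above it, extra samples only amplify the noise contribution, which is exactly the trade-off the abstract describes.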

    Research of collision properties of the modified UMAC algorithm on crypto-code constructions

    The transfer of information over telecommunication channels is accompanied by message hashing to control data integrity and confirm data authenticity. When a reliable hash function is used, it is computationally difficult to create a fake message with a pre-existing hash code; however, due to weaknesses of specific hashing algorithms, this threat can become feasible. Existing ways of creating hash codes to increase the cryptographic strength of messages transmitted over telecommunication channels are, according to practical research, imperfect in terms of generation speed and degree of cryptographic strength. The collision properties of hash functions formed using the modified UMAC algorithm are investigated using a methodology for assessing the universality and strict universality of hash codes. Based on the results, an assessment is presented of the impact of the proposed modifications at the last stage of authentication-code generation on the preservation of universal hashing properties. The advantages and disadvantages of forming the hash code by previously known methods are analyzed. The scheme of cascading generation of data integrity and authenticity control codes using the UMAC algorithm on crypto-code constructions has been improved. Algorithms for checking hash codes against the requirements of universality and strict universality were developed. Collision search in the set of generated hash codes was calculated and analyzed according to the requirements of the universal and strictly universal classes of hash codes.
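The universality property being checked here says that for a hash family drawn at random, Pr[h(x) = h(y)] <= 1/m for any fixed x != y, where m is the number of hash values. A toy multiply-mod-prime family (not the UMAC construction itself, which is only summarized above) makes the empirical check concrete:

```python
import random

# Empirical collision-rate check for a toy universal family
# h_{a,b}(x) = ((a*x + b) mod p) mod m, standing in for the hash being
# analyzed. For a universal class, the collision rate should stay near 1/m.

P, M = 2**31 - 1, 1024
rng = random.Random(42)

def collide_once(x, y):
    """Draw one (a, b) and report whether x and y collide under h_{a,b}."""
    a, b = rng.randrange(1, P), rng.randrange(P)
    return ((a * x + b) % P) % M == ((a * y + b) % P) % M

trials = 50_000
hits = sum(collide_once(123456, 654321) for _ in range(trials))
rate = hits / trials
print(rate, 1 / M)
assert rate < 2 / M   # comfortably within the universal-hashing bound
```

The same counting procedure, run over the codes produced by the modified UMAC stages, is essentially what the collision analysis in the paper evaluates.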

    Development of a hardware cryptosystem based on a random number generator with two types of entropy sources

    In modern software, crypto-algorithms are widely used both for data encryption and for authentication and integrity checks. There are well-known and proven crypto-algorithms whose cryptographic strength is either mathematically proven or based on the need to solve a mathematically complex problem (factorization, discrete logarithm, etc.). On the other hand, in the computer world information constantly appears about errors or "holes" in particular programs (including ones that use crypto-algorithms) or about such programs being broken (cracked). This creates distrust both in specific programs and in the very possibility of protecting anything by cryptographic methods, not only from intelligence services but also from ordinary hackers. A promising direction of research in this field is the implementation in cryptosystems of a hybrid random number generator with two types of entropy sources. The method and means of implementing such a generator are presented: an external source based on Zener diode noise, and an internal source based on the uncertain state of a transistor-transistor logic (TTL) structure. One practical implementation of the random number generator is presented, together with its functional diagram. A MATLAB/Simulink model of the proposed generator is built, and the results of statistical analysis of the generated random sequences with the NIST SP 800-22 test suite are given.
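The combining logic of such a hybrid scheme can be sketched in a few lines: bitstreams from two independent entropy sources are XOR-combined, and the output is screened with the NIST SP 800-22 frequency (monobit) check. Here seeded PRNGs merely stand in for the Zener-diode and TTL sources, which are hardware:

```python
import random

zener = random.Random(1)   # stand-in for the external noise source
ttl = random.Random(2)     # stand-in for the internal TTL source

def hybrid_bits(n):
    # XOR combination: the output is at least as unpredictable as the
    # less predictable of the two sources.
    return [zener.getrandbits(1) ^ ttl.getrandbits(1) for _ in range(n)]

def monobit_ok(bits, limit=2.58):
    # NIST SP 800-22 frequency test statistic |S_n| / sqrt(n), compared
    # with a ~1% significance threshold.
    s = sum(1 if b else -1 for b in bits)
    return abs(s) / len(bits) ** 0.5 < limit

bits = hybrid_bits(10_000)
print("balanced:", monobit_ok(bits))
```

A real evaluation would run the full NIST SP 800-22 battery (runs, block frequency, approximate entropy, etc.), of which the monobit test is only the first.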

    CYBER THREAT CLASSIFIER FOR INFORMATION RESOURCES OF AUTOMATED BANKING SYSTEMS

    The modern development of high technology and computing has significantly intensified the development of automated banking systems (ABS) in banking-sector organizations (BSO) and has made it possible to synthesize the information and communication technologies used to build them. However, the era of high technology has also widened the spectrum of threats to banking information resources (BIR), and threats have acquired hybrid and synergistic features. Under these conditions, a pressing issue in building an information security management system (ISMS) in banking-sector organizations is the identification and analysis of current threats. To generalize the approach to classifying hybrid cyber threats by the security components of banking information resources (information security, cybersecurity, and security of information), the paper proposes an improved classifier of threats to banking information resources that takes into account the layers of the ISO/OSI model in automated banking systems, the targeting of threats at security services, and the criticality of the resulting damage. The article analyzes current international standards and regulatory documents of the National Bank of Ukraine on the security of banking information resources. Based on this analysis, estimates are proposed for the degree of danger posed by attackers and the degree of implementation of protective measures under current hybrid cyber threats.
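A classifier of this shape can be represented as a table tagging each threat with the ISO/OSI layer it targets, the security service it attacks, and an expert damage-criticality grade. The entries and grades below are examples for illustration, not the paper's data:

```python
# Illustrative fragment of a cyber-threat classifier for banking
# information resources. All entries and criticality grades (1..4) are
# hypothetical examples.

THREATS = [
    # (threat, OSI layer, targeted security service, criticality 1..4)
    ("ARP spoofing",        2, "integrity",       3),
    ("DDoS on e-banking",   7, "availability",    4),
    ("TLS downgrade",       6, "confidentiality", 3),
    ("phishing of clients", 7, "authenticity",    4),
]

def most_critical(threats, service):
    """Highest-criticality threats against a given security service."""
    hits = [t for t in threats if t[2] == service]
    top = max(t[3] for t in hits)
    return [t[0] for t in hits if t[3] == top]

print(most_critical(THREATS, "availability"))  # ['DDoS on e-banking']
```

Grouping the same table by OSI layer, rather than by service, yields the per-layer view that the classifier uses to prioritize protective measures.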

    Improving the performance of random sequence generation for information security systems

    The ways of enhancing the productivity of generating random sequences derived from physical sources for information protection systems are substantiated. This is necessary because today there is rapid growth in the technological capabilities and rate indicators of the various information services and applications required by the community. One of the main issues in the safe use of these services is ensuring information security, which requires effective high-rate information protection systems and high-performance generation of random data sequences. In the course of the research, aimed at enhancing productivity, the features of converting actual noise processes were analyzed, taking into consideration their non-stationarity and deviations from the probability distribution. Ways are proposed to improve the methods of analog-to-digital conversion, with optimization of the quantization scale of the dynamic range and of the pitch of discretization of the noise process over time. With a view to aligning statistical characteristics, the possibility was explored of using processing methods that enhance the statistical quality of the sequence while limiting rate losses. These are the method of sampling equally probable combinations (von Neumann, Elias, Ryabko, Matchikina) and the method of code processing (Santha, Vazirani), which provide increased effectiveness due to code extension and involve conversion of the sequence: in the first method, using equally probable combinations and rejecting unnecessary data; in the second, without rejection, with the possibility of linear conversion. In order to optimize the conversion parameters at both stages of generation, and to adapt these parameters to the peculiarities and changeability of the characteristics of the converted random processes, it was proposed to use feedback from the converters' outputs to previous conversion elements.
The adjustment of the specified parameters can be made during generation, based on the results of statistical analysis of the outputs of the conversion stages. The obtained results are quite important, since their implementation in modern information protection systems will make it possible to guarantee information security, the safe use of applications of the modern information service, and the introduction of new applications.
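The first post-processing method named in this abstract, von Neumann's sampling of equally probable combinations, is short enough to show in its classic form: non-overlapping bit pairs 01 and 10 are kept (as 0 and 1 respectively), while 00 and 11 are discarded, which removes bias at the cost of throughput:

```python
# Classic von Neumann extractor: for independent bits with fixed bias,
# P(01) == P(10), so the kept bits are unbiased; 00 and 11 carry the
# bias and are rejected.

def von_neumann(bits):
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)   # pair 01 -> 0, pair 10 -> 1
    return out

raw = [0, 1, 1, 0, 0, 0, 1, 1, 0, 1]
print(von_neumann(raw))  # [0, 1, 0]
```

The rejection of 00/11 pairs is exactly the "rate loss" the abstract mentions, and it is what motivates the extensions (Elias, Ryabko-Matchikina) and the non-rejecting Santha-Vazirani code processing as higher-throughput alternatives.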